
    The size-brightness correspondence: evidence for crosstalk among aligned conceptual feature dimensions

    The same core set of cross-sensory correspondences connecting stimulus features across different sensory channels is observed regardless of the modality of the stimulus with which the correspondences are probed. This observation suggests that correspondences involve modality-independent representations of aligned conceptual feature dimensions, and it predicts a size-brightness correspondence in which smaller is aligned with brighter. This suggestion also accommodates cross-sensory congruity effects where contrasting feature values are specified verbally rather than perceptually (e.g., where the words WHITE and BLACK interact with the classification of high- and low-pitched sounds). Experiment 1 brings these two issues together in assessing a conceptual basis for correspondences. The names of bright/white and dark/black substances were presented in a speeded brightness classification task in which the two alternative response keys differed in size. A size-brightness congruity effect was confirmed: substance names were classified more quickly when the relative size of the response key to be pressed was congruent with the brightness of the named substance (e.g., when yoghurt was classified as a bright substance by pressing the smaller of the two keys). Experiment 2 assessed the proposed conceptual basis for this congruity effect by requiring the same named substances to be classified according to their edibility (all of the bright/dark substances having been selected for their edibility/inedibility, respectively). The predicted absence of a size-brightness congruity effect, along with other aspects of the results, supports the proposed conceptual basis for correspondences and speaks against accounts in which modality-specific perceptuomotor representations are entirely responsible for correspondence-induced congruity effects.
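
    Operationally, a congruity effect of this kind is a reaction-time advantage on congruent trials. A minimal sketch of that comparison in Python (the data and the test are illustrative assumptions; the abstract does not report the analysis details):

        import numpy as np
        from scipy import stats

        # Hypothetical per-trial reaction times (ms) for one participant.
        # Congruent: e.g., a bright substance classified with the smaller key.
        rt_congruent = np.array([512, 498, 530, 505, 521, 490, 515])
        # Incongruent: e.g., a bright substance classified with the larger key.
        rt_incongruent = np.array([548, 560, 533, 571, 552, 545, 566])

        # The congruity effect is the mean slowing on incongruent trials.
        effect = rt_incongruent.mean() - rt_congruent.mean()
        t, p = stats.ttest_ind(rt_incongruent, rt_congruent)
        print(f"congruity effect = {effect:.1f} ms (t = {t:.2f}, p = {p:.3f})")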

    Summation of visual attributes in auditory‐visual crossmodal correspondences

    Crossmodal correspondences are a feature of human perception in which two or more sensory dimensions are linked together; for example, high‐pitched noises may be more readily linked with small objects than large objects. However, no study has yet systematically examined the interaction between different visual‐auditory crossmodal correspondences. We investigated how the visual dimensions of luminance, saturation, size and vertical position can influence decisions when matching particular visual stimuli with high‐pitched or low‐pitched auditory stimuli. For multi‐dimensional stimuli, we found a general pattern of summation of individual crossmodal correspondences, with some exceptions that may be explained by Garner interference. These findings have applications for the design of sensory substitution systems, which convert information from one sensory modality to another.
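
    The reported summation pattern can be pictured as additive contributions to the log-odds of matching a visual stimulus with the high-pitched sound. The sketch below is an illustrative model only; the weights and the logistic form are assumptions, not the paper's fitted values:

        import math

        # Assumed per-dimension weights: positive values push the match
        # toward the HIGH-pitched sound; each feature is coded in [-1, 1].
        WEIGHTS = {
            "luminance": 1.2,    # brighter -> higher pitch
            "size": -1.5,        # larger -> lower pitch
            "elevation": 1.0,    # higher position -> higher pitch
            "saturation": 0.4,   # more saturated -> higher pitch (weak)
        }

        def p_high_pitch(features):
            """Probability of choosing the high-pitched sound, with the
            individual correspondences summed on the log-odds scale."""
            logit = sum(WEIGHTS[k] * v for k, v in features.items())
            return 1.0 / (1.0 + math.exp(-logit))

        # A small, bright, elevated stimulus: every cue agrees, so the
        # summed correspondence strongly favours the high-pitched match.
        print(p_high_pitch({"luminance": 1, "size": -1,
                            "elevation": 1, "saturation": 0}))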

    A brain-inspired cognitive system that mimics the dynamics of human thought

    In recent years, some impressive AI systems have been built that can play games and answer questions about large quantities of data. However, we are still a very long way from AI systems that can think and learn in a human-like way. We have a great deal of information about how the brain works and can simulate networks of hundreds of millions of neurons, so it seems likely that we could use our neuroscientific knowledge to build brain-inspired artificial intelligence that acts like humans on similar timescales. This paper describes an AI system that we have built using a brain-inspired network of artificial spiking neurons. On a word recognition and colour naming task, our system behaves like human subjects on a similar timescale. In the longer term, this type of AI technology could lead to more flexible general-purpose artificial intelligence and to more natural human-computer interaction.
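
    The abstract does not name the neuron model, but networks of artificial spiking neurons are commonly built from leaky integrate-and-fire units. A minimal sketch of one such unit (all parameters are illustrative assumptions):

        import numpy as np

        def simulate_lif(input_current, dt=1e-4, tau=0.02, v_rest=-0.07,
                         v_thresh=-0.05, v_reset=-0.07, r_m=1e8):
            """Leaky integrate-and-fire neuron: the membrane voltage
            leaks toward rest, integrates the input current, and emits
            a spike (then resets) whenever it crosses threshold."""
            v = v_rest
            spike_times = []
            for step, i_in in enumerate(input_current):
                v += (-(v - v_rest) + r_m * i_in) * dt / tau
                if v >= v_thresh:
                    spike_times.append(step * dt)
                    v = v_reset
            return spike_times

        # 200 ms of constant 0.3 nA input yields a regular spike train.
        current = np.full(2000, 0.3e-9)
        print(simulate_lif(current))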

    Crossmodal correspondences: A tutorial review


    Auditory event-related potentials

    Auditory event-related potentials are electric potentials (AERP, AEP) and magnetic fields (AEF) generated by the synchronous activity of large neural populations in the brain and time-locked to some actual or expected sound event.
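
    In practice, an AERP is estimated by averaging many epochs of the EEG time-locked to the sound onsets, so that activity uncorrelated with the event averages out. A minimal sketch (the sampling rate, window, and synthetic data are assumptions for illustration):

        import numpy as np

        def compute_aerp(eeg, event_samples, fs=1000, pre=0.1, post=0.5):
            """Average stimulus-locked epochs from a single-channel EEG
            trace, baseline-correcting each epoch on its pre-stimulus
            interval. Noise uncorrelated with the event shrinks roughly
            as 1/sqrt(n_epochs), leaving the evoked response."""
            n_pre, n_post = int(pre * fs), int(post * fs)
            epochs = []
            for onset in event_samples:
                if onset - n_pre < 0 or onset + n_post > len(eeg):
                    continue  # skip events too close to the recording edges
                epoch = eeg[onset - n_pre : onset + n_post]
                epochs.append(epoch - epoch[:n_pre].mean())
            return np.mean(epochs, axis=0)

        # Synthetic example: a small evoked deflection buried in noise.
        fs, n = 1000, 60_000
        rng = np.random.default_rng(0)
        eeg = rng.normal(0.0, 5e-6, n)
        events = np.arange(1000, n - 1000, 700)
        for e in events:
            eeg[e : e + 100] += 2e-6  # hypothetical 100-ms evoked response
        print(compute_aerp(eeg, events, fs).shape)  # (600,)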